Lecture 13: Randomized Least-squares Approximation in Practice, Cont.

Author

  • Michael Mahoney
Abstract

The basic idea is that rather than doing a QR decomposition of A, we do a QR decomposition of ΠA, where Π is an FJLT (or some other, e.g., data-aware, subspace embedding), and then use the R̃−1 from the QR of the subproblem as a preconditioner for an iterative algorithm on the original problem. We saw that if ΠA = Q̃R̃, then κ(AR̃−1) = κ(SU). If we sample "enough," i.e., Ω(d log(d)/ε) rows, then this condition number is ≤ 1 + ε, and the very good subspace embedding provides a very good preconditioner.
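The sketch-and-precondition idea above can be illustrated with a minimal NumPy/SciPy sketch. This is an illustrative toy, not the Blendenpik implementation: a dense Gaussian projection stands in for the FJLT, SciPy's LSQR stands in for the iterative solver, and the problem sizes and oversampling factor are assumptions chosen for demonstration.

```python
# Hedged sketch of sketch-and-precondition least squares: QR on the
# sketched matrix Pi @ A yields R, and R^{-1} preconditions LSQR on
# the original problem. Gaussian sketch used in place of an FJLT.
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
n, d = 2000, 20                       # very overconstrained: n >> d
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Sketch with s ~ O(d log d / eps) rows; s = 8d is an illustrative choice.
s = 8 * d
Pi = rng.standard_normal((s, n)) / np.sqrt(s)

# QR of the small (s x d) sketched matrix gives the preconditioner R.
_, R = np.linalg.qr(Pi @ A)

# Solve the preconditioned problem min_y ||(A R^{-1}) y - b||_2, then
# recover x = R^{-1} y.
AR = A @ np.linalg.inv(R)
y = lsqr(AR, b)[0]
x = np.linalg.solve(R, y)

# AR is well conditioned, so LSQR converges quickly and x agrees with
# the direct least-squares solution.
x_direct = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.cond(AR), np.linalg.norm(x - x_direct))
```

Note the design point: the expensive QR is done only on the small s × d sketch, while the original n × d matrix A is touched only through matrix-vector products inside the iterative solver.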


Similar resources

Lecture 17: Toward Randomized Low-rank Approximation in Practice

As with the LS problem and algorithms, here we also want to understand how these theoretical ideas for randomized low-rank matrix approximation can be used in practice. As we will see, just as with the LS problem and algorithms, the basic ideas do go through to practical situations, but some of the theory must be modified in certain ways. Among the issues that will come up for the randomized lo...


Lecture 7: Sampling/Projections for Least-squares Approximation, Cont.

We continue with the discussion from last time. There is no new reading, just the same as last class. Recall that last time we provided a brief overview of LS problems and a brief overview of sketching methods for LS problems. For the latter, we provided a lemma that showed that under certain conditions the solution of a sketched LS problem was a good approximation to the solution of the origin...


Lecture 12: Randomized Least-squares Approximation in Practice

Recall that we are interested in the conditioning quality of randomized sketches constructed by RandNLA sampling and projection algorithms. For simplicity of comparison with the Blendenpik paper, I’ll state the results as they are stated in that paper, i.e., with a Hadamard-based projection, and then I’ll point out the generalization (e.g., to leverage score sampling, to other types of projecti...


Lecture 11: Randomized Least-squares Approximation in Practice

There are several issues that must be dealt with to implement these RandNLA ideas in practice. By this, we mean that we will want to implement these algorithms as is done in NLA, e.g., in LAPACK. Different issues arise when we implement these algorithms in a data center or a database or a distributed environment or a parallel shared memory environment; and different issues arise when we impleme...


CS 294: Randomized Algorithms for Matrices and Data, Lecture 6 - 09/23/2013. Lecture 6: Sampling/Projections for Least-squares Approximation

In many applications, we want to find an approximate solution to a problem or set of equations that, for noise reasons or whatever other reasons, does not have a solution, or not unrelatedly does not have a unique solution. A canonical example of this is given by the very overconstrained (i.e., overdetermined) least-squares (LS) problem, and this will be our focus for the next several classes. ...




Publication date: 2015